Learning better discourse representation for implicit discourse relation recognition via attention networks
Authors
Abstract
Humans comprehend the meanings and relations of discourses heavily relying on their semantic memory that encodes general knowledge about concepts and facts. Inspired by this, we propose a neural recognizer for implicit discourse relation analysis, which builds upon a semantic memory that stores knowledge in a distributed fashion. We refer to this recognizer as SeMDER. Starting from word embeddings of discourse arguments, SeMDER employs a shallow encoder to generate a distributed surface representation for a discourse. A semantic encoder with attention to the semantic memory matrix is further established over surface representations. It is able to retrieve a deep semantic meaning representation for the discourse from the memory. Using the surface and semantic representations as input, SeMDER finally predicts implicit discourse relations via a neural recognizer. Experiments on the benchmark data set show that SeMDER benefits from the semantic memory and achieves substantial improvements of 2.56% on average over current state-of-the-art baselines in terms of F1-score.
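As a rough illustration of the pipeline the abstract describes, the following minimal NumPy sketch pools argument word embeddings into a surface representation, performs an attention read over a semantic memory matrix to obtain a semantic representation, and feeds both into a softmax classifier over relation labels. The mean-pooling shallow encoder, the query projection, the dot-product attention scoring, and all dimensions are illustrative assumptions, not the authors' actual implementation.

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def surface_representation(arg1_embeddings, arg2_embeddings):
    # Shallow encoder (assumed): average-pool each argument's word
    # embeddings and concatenate the two pooled vectors.
    return np.concatenate([arg1_embeddings.mean(axis=0),
                           arg2_embeddings.mean(axis=0)])

def semantic_representation(surface, memory, W_q):
    # Attention over the semantic memory matrix (rows = memory slots):
    # project the surface vector to a query, score every slot by dot
    # product, and read out a weighted sum of slots.
    query = W_q @ surface          # (d_mem,)
    scores = memory @ query        # (n_slots,)
    weights = softmax(scores)      # attention distribution over slots
    return weights @ memory        # (d_mem,)

def predict_relation(surface, semantic, W_out, b_out):
    # Neural recognizer (assumed to be a single softmax layer) over the
    # concatenated surface and semantic representations.
    features = np.concatenate([surface, semantic])
    return softmax(W_out @ features + b_out)

# Toy usage with random parameters; dimensions are illustrative only.
rng = np.random.default_rng(0)
d_emb, d_mem, n_slots, n_classes = 50, 64, 100, 4
arg1 = rng.normal(size=(12, d_emb))          # word embeddings of argument 1
arg2 = rng.normal(size=(9, d_emb))           # word embeddings of argument 2
memory = rng.normal(size=(n_slots, d_mem))   # semantic memory matrix
W_q = rng.normal(size=(d_mem, 2 * d_emb))
W_out = rng.normal(size=(n_classes, 2 * d_emb + d_mem))
b_out = np.zeros(n_classes)

surf = surface_representation(arg1, arg2)
sem = semantic_representation(surf, memory, W_q)
probs = predict_relation(surf, sem, W_out, b_out)
print(probs)  # probability distribution over implicit discourse relations
```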
Similar resources
SWIM: A Simple Word Interaction Model for Implicit Discourse Relation Recognition
Capturing the semantic interaction of pairs of words across arguments and proper argument representation are both crucial issues in implicit discourse relation recognition. The current state-of-the-art represents arguments as distributional vectors that are computed via bi-directional Long Short-Term Memory networks (BiLSTMs), known to have significant model complexity. In contrast, we demonstra...
Leveraging Synthetic Discourse Data via Multi-task Learning for Implicit Discourse Relation Recognition
To overcome the shortage of labeled data for implicit discourse relation recognition, previous works attempted to automatically generate training data by removing explicit discourse connectives from sentences and then built models on these synthetic implicit examples. However, a previous study (Sporleder and Lascarides, 2008) showed that models trained on these synthetic data do not generalize ...
Recognizing Implicit Discourse Relations through Abductive Reasoning with Large-scale Lexical Knowledge
Discourse relation recognition is the task of identifying the semantic relationships between textual units. Conventional approaches to discourse relation recognition exploit surface information and syntactic information as machine learning features. However, the performance of these models is severely limited for implicit discourse relation recognition. In this paper, we propose an abductive th...
Implicit Discourse Relation Classification via Multi-Task Neural Networks
Without discourse connectives, classifying implicit discourse relations is a challenging task and a bottleneck for building a practical discourse parser. Previous research usually makes use of one kind of discourse framework such as PDTB or RST to improve the classification performance on discourse relations. Actually, under different discourse annotation frameworks, there exist multiple corpor...
Memory Augmented Attention Model for Chinese Implicit Discourse Relation Recognition
Recently, Chinese implicit discourse relation recognition has attracted more and more attention, since it is crucial to understand the Chinese discourse text. In this paper, we propose a novel memory augmented attention model which represents the arguments using an attention-based neural network and preserves the crucial information with an external memory network which captures each discourse ...
Journal: Neurocomputing
Volume: 275
Pages: -
Publication year: 2018